15. Softmax
Multi-Class Classification and Softmax
Quiz - Softmax
The Softmax Function
In the next video, we'll learn about the softmax function, which is the generalization of the sigmoid activation function to problems with 3 or more classes.
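To see the connection between the two, here is a small sketch (not part of the course code) showing that for exactly two classes, softmax reduces to the sigmoid of the difference of the scores:

```python
import numpy as np

def sigmoid(x):
    # Standard sigmoid activation.
    return 1.0 / (1.0 + np.exp(-x))

def softmax(scores):
    # Exponentiate each score and normalize so the outputs sum to 1.
    exp_scores = np.exp(scores)
    return exp_scores / exp_scores.sum()

a, b = 2.0, 0.5
p = softmax([a, b])
# For two classes: softmax([a, b])[0] == sigmoid(a - b),
# since e^a / (e^a + e^b) = 1 / (1 + e^(b - a)).
```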
DL 18 Q Softmax V2
Softmax Quiz
SOLUTION:
DL 18 S Softmax
Quiz: Coding Softmax
And now, your time to shine! Let's code the formula for the Softmax function in Python.
Start Quiz:
import numpy as np
# Write a function that takes as input a list of numbers, and returns
# the list of values given by the softmax function.
def softmax(L):
    pass
import numpy as np

def softmax(L):
    expL = np.exp(L)
    sumExpL = sum(expL)
    result = []
    for i in expL:
        result.append(i*1.0/sumExpL)
    return result

# Note: The function np.divide can also be used here, as follows:
# def softmax(L):
#     expL = np.exp(L)
#     return np.divide(expL, expL.sum())
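One practical caveat, as a hedged refinement rather than part of the course solution: np.exp overflows for large scores, so a common trick is to subtract the maximum score before exponentiating. This leaves the result unchanged, because softmax is invariant to adding a constant to every score.

```python
import numpy as np

def softmax_stable(L):
    # Shift scores so the largest is 0; this avoids overflow in np.exp
    # without changing the softmax output.
    shifted = np.array(L, dtype=float) - np.max(L)
    expL = np.exp(shifted)
    return expL / expL.sum()
```

With the plain version, an input like [1000, 1000] produces inf/inf = nan; the shifted version handles it without trouble.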
User's Answer:
(Note: the user's answer is not guaranteed to be correct)
import numpy as np
# Write a function that takes as input a list of numbers, and returns
# the list of values given by the softmax function.
def softmax(L):
    exp_l = np.exp(L)
    my_sum = sum(exp_l)
    my_divisor = 1.0/my_sum
    #exp_l = exp_l * my_divisor
    result = []
    for e in exp_l:
        result.append(e*1.0/my_sum)
    return result